
    Photoelastic force measurements in granular materials

    Photoelastic techniques are used to make both qualitative and quantitative measurements of the forces within idealized granular materials. The method is based on placing a birefringent granular material between a pair of polarizing filters, so that each region of the material rotates the polarization of light according to the amount of local stress. In this review paper, we summarize past work using the technique, describe the optics underlying it, and illustrate how it can be used to quantitatively determine the vector contact forces between particles in a 2D granular system. We provide a description of software resources available to perform this task, as well as key techniques and resources for building an experimental apparatus.
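    The stress-to-brightness relation underlying the technique can be sketched numerically. The snippet below is an illustrative model, not the review's software: for a dark-field circular polariscope, the retardation follows the stress-optic law and the transmitted intensity varies as the squared sine of half the retardation. The thickness, stress-optic coefficient, and wavelength values are placeholder assumptions.

```python
import numpy as np

def transmitted_intensity(sigma1, sigma2, thickness, stress_optic_coeff,
                          wavelength, i0=1.0):
    """Dark-field circular polariscope intensity for a birefringent sheet.

    The retardation is given by the stress-optic law:
        delta = 2*pi*thickness*C*(sigma1 - sigma2) / wavelength
    and the transmitted intensity is I = I0 * sin^2(delta / 2), so an
    unstressed region appears dark and fringes appear as the principal
    stress difference grows.  All parameter values here are illustrative.
    """
    delta = (2.0 * np.pi * thickness * stress_optic_coeff
             * (sigma1 - sigma2) / wavelength)
    return i0 * np.sin(delta / 2.0) ** 2

# Sweep the principal-stress difference (Pa): intensity oscillates,
# producing the familiar photoelastic fringe pattern.
stress_diff = np.linspace(0.0, 2.0e6, 5)
intensities = transmitted_intensity(stress_diff, 0.0, 3e-3, 1e-10, 550e-9)
```

Inverting this relation pixel-by-pixel (fringe counting) is what lets the method recover quantitative contact forces.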

    Information Management to Mitigate Loss of Control Airline Accidents

    Loss of control in flight continues to be the leading contributor to airline accidents worldwide, and unreliable airspeed has been a contributing factor in many of these accidents. Airlines and the FAA have developed training programs for pilot recognition of these airspeed events, and many checklists have been designed to help pilots troubleshoot. In addition, new aircraft designs incorporate features to detect and respond to such situations. NASA has been using unreliable airspeed events while conducting research recommended by the Commercial Aviation Safety Team. Even after significant industry focus on unreliable airspeed, research and other evidence show that highly skilled and trained pilots can still be confused by the condition, and there is a lack of understanding of what the associated checklist(s) attempt to uncover. Common-mode failures of analog sensors designed for measuring airspeed continue to confound both humans and automation when determining which indicators are correct. This paper describes failures that have occurred in the past and where/how pilots may still struggle in determining reliable airspeed when confronted with conflicting information. Two latest-generation aircraft architectures are discussed and contrasted. This information is used to explain why adding more sensors within classic control theory will not solve the problem. Technology concepts are suggested for utilizing existing synoptic pages and a new synoptic page called the System Interactive Synoptic (SIS). The SIS details the flow of flight-critical data through the avionics system and how it is used by the automation. This new synoptic page, as well as existing synoptics, can be designed to be used in concert with a simplified electronic checklist (sECL) to significantly reduce the time to configure the flight deck avionics in the event of a system or sensor failure.
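    To see why redundancy alone does not resolve common-mode airspeed failures, consider a classic triplex mid-value-select voter. This is a generic sketch, not any specific avionics implementation, and the airspeed values are invented: one faulty sensor is out-voted, but two sensors failing the same way (e.g., two pitot probes icing over together) make the voter confidently select a wrong value.

```python
def mid_value_select(readings):
    """Classic triplex voter: take the median reading so that a single
    outlying sensor is out-voted by the two agreeing sensors."""
    return sorted(readings)[len(readings) // 2]

# Single-sensor fault: the median rejects the bad value.
healthy = mid_value_select([252.0, 250.0, 251.0])   # -> 251.0

# Common-mode fault: two probes ice over and read low together, so the
# median is now one of the *failed* readings -- more sensors voting the
# same classic way would not help.
iced = mid_value_select([140.0, 251.0, 141.0])      # -> 141.0
```

This is the failure pattern the paper argues cannot be fixed by simply adding sensors within classic control theory.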

    Evaluation of Technology Concepts for Energy, Automation, and System State Awareness in Commercial Airline Flight Decks

    A pilot-in-the-loop flight simulation study was conducted at NASA Langley Research Center to evaluate flight deck systems that (1) provide guidance for recovery from low energy states and stalls, (2) present the current state and expected future state of automated systems, and/or (3) show the state of flight-critical data systems in use by automated systems and primary flight instruments. The study was conducted using 13 commercial airline crews from multiple airlines, paired by airline to minimize procedural effects. Scenarios spanned a range of complex conditions, and several emulated causal and contributing factors found in recent accidents involving loss of state awareness by pilots (e.g., energy state, automation state, and/or system state). Three new technology concepts were evaluated while used in concert with current state-of-the-art flight deck systems and indicators. The technologies include a stall recovery guidance algorithm and display concept, an enhanced airspeed control indicator that shows when automation is no longer actively controlling airspeed, and enhanced synoptic pages designed to work with simplified interactive electronic checklists. An additional synoptic was developed to provide the flight crew with information about the effects of loss of flight-critical data. Data were collected via questionnaires administered at the completion of flight scenarios, audio/video recordings, flight data, head and eye tracking data, pilot control inputs, and researcher observations. This paper presents findings derived from the questionnaire responses and subjective data measures including workload, situation awareness, usability, and acceptability, as well as analyses of two low-energy flight events that resulted in near-stall conditions.

    Usability Evaluation of Indicators of Energy-Related Problems in Commercial Airline Flight Decks

    A series of pilot-in-the-loop flight simulation studies were conducted at NASA Langley Research Center to evaluate indicators aimed at supporting the flight crew's awareness of problems related to energy states. Indicators were evaluated utilizing state-of-the-art flight deck systems such as those found on commercial air transport aircraft. This paper presents results for four technologies: (1) conventional primary flight display speed cues, (2) an enhanced airspeed control indicator, (3) a synthetic vision baseline that provides a flight path vector, speed error, and an acceleration cue, and (4) an aural airspeed alert that triggers when current airspeed deviates beyond a specified threshold from the selected airspeed. Full-mission, high-fidelity flight simulation studies were conducted using commercial airline crews. Crews were paired by airline for common crew resource management procedures and protocols. Scenarios spanned a range of complex conditions while emulating several causal factors reported in recent accidents involving loss of energy state awareness by pilots. Data collection included questionnaires administered at the completion of flight scenarios, aircraft state data, audio/video recordings of the flight crew, eye tracking, pilot control inputs, and researcher observations. Questionnaire response data included subjective measures of workload, situation awareness, complexity, usability, and acceptability. This paper reports relevant findings derived from subjective measures as well as quantitative measures.
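    The deviation-threshold trigger described for technology (4) can be sketched in a few lines. This is an illustration only; the threshold value is an assumed example, not the study's actual setting.

```python
def airspeed_alert(current_kts, selected_kts, threshold_kts=10.0):
    """Return True when current airspeed deviates from the selected
    airspeed by more than the threshold (threshold_kts is an assumed
    example value, not the study's configuration)."""
    return abs(current_kts - selected_kts) > threshold_kts

# 12 kt below the selected speed: alert fires.
assert airspeed_alert(238.0, 250.0)
# 5 kt deviation: within tolerance, no alert.
assert not airspeed_alert(245.0, 250.0)
```

A fielded implementation would typically add hysteresis and a persistence timer so the aural alert does not chatter near the threshold.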

    Proactive esophageal cooling protects against thermal insults during high-power short-duration radiofrequency cardiac ablation

    Background: Proactive cooling with a novel cooling device has been shown to reduce endoscopically identified thermal injury during radiofrequency (RF) ablation for the treatment of atrial fibrillation using medium power settings. We aimed to evaluate the effects of proactive cooling during high-power short-duration (HPSD) ablation. Methods: A computer model accounting for the left atrium (1.5 mm thickness) and esophagus, including the active cooling device, was created. We used the Arrhenius equation to estimate the esophageal thermal damage during 50 W/10 s and 90 W/4 s RF ablations. Results: With proactive esophageal cooling in place, temperatures in the esophageal tissue were significantly reduced from control conditions without cooling, and the resulting percentage of damage to the esophageal wall was reduced by around 50%, restricting damage to the epi-esophageal region and consequently sparing the remainder of the esophageal tissue, including the mucosal surface. Lesions in the atrial wall remained transmural despite cooling, and maximum width barely changed (<0.8 mm). Conclusions: Proactive esophageal cooling significantly reduces temperatures and the resulting fraction of damage in the esophagus during HPSD ablation. These findings offer a mechanistic rationale explaining the high degree of safety encountered to date using proactive esophageal cooling, and further underscore the fact that temperature monitoring is inadequate to avoid thermal damage to the esophagus.

    Research reported in this publication was supported by the National Heart, Lung, and Blood Institute of the National Institutes of Health under Award Number R44HL158375 (the content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health) and by the Spanish Ministerio de Ciencia, Innovacion y Universidades/Agencia Estatal de Investigacion (MCIN/AEI/10.13039/501100011033, under grant RTI2018-094357-B-C21).

    Mercado Montoya, M.; Gomez Bustamante, T.; Berjano, E.; Mickelsen, S.R.; Daniels, J.D.; Hernández Arango, P.; Schieber, J.; ... (2022). Proactive esophageal cooling protects against thermal insults during high-power short-duration radiofrequency cardiac ablation. International Journal of Hyperthermia, 39(1), 1202-1212. https://doi.org/10.1080/02656736.2022.2121860
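    The Arrhenius damage estimate mentioned in the Methods can be sketched generically: the damage integral accumulates a rate term A*exp(-ΔE/(R*T)) over the tissue's temperature history, and the damaged fraction is 1 - exp(-Ω). This is an illustration, not the authors' model; the kinetic coefficients below are commonly quoted soft-tissue values used here only as assumptions.

```python
import math

def arrhenius_damage(temps_kelvin, dt, a=7.39e39, delta_e=2.577e5):
    """Arrhenius thermal-damage integral over a sampled temperature history.

    Omega = sum over samples of A * exp(-delta_e / (R * T)) * dt, and the
    damaged-tissue fraction is 1 - exp(-Omega).  The frequency factor `a`
    (1/s) and activation energy `delta_e` (J/mol) are illustrative
    assumptions, not the paper's fitted coefficients.
    """
    R = 8.314  # universal gas constant, J/(mol*K)
    omega = sum(a * math.exp(-delta_e / (R * t)) * dt for t in temps_kelvin)
    return 1.0 - math.exp(-omega)

# Ten seconds at 50 C (323 K) vs. ten seconds at body temperature (310 K):
# damage accrues appreciably only at the elevated temperature, which is why
# shaving even a few degrees off the esophageal wall matters.
hot = arrhenius_damage([323.15] * 10, 1.0)
warm = arrhenius_damage([310.15] * 10, 1.0)
```

The strong exponential temperature dependence is also why surface temperature monitoring alone can lag the damage already accumulating deeper in the wall.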

    A damage model based on failure threshold weakening

    A variety of studies have modeled the physics of material deformation and damage as examples of generalized phase transitions, involving either critical phenomena or spinodal nucleation. Here we study a model for frictional sliding with long-range interactions and recurrent damage that is parameterized by a process of damage and partial healing during sliding. We introduce a failure threshold weakening parameter into the cellular-automaton slider-block model which allows blocks to fail at a reduced failure threshold for all subsequent failures during an event. We show that a critical point is reached beyond which the probability of a system-wide event scales with this weakening parameter. We provide a mapping to the percolation transition, and show that the values of the scaling exponents approach the values for mean-field percolation (spinodal nucleation) as lattice size L is increased for fixed R. We also examine the effect of the weakening parameter on the frequency-magnitude scaling relationship and the ergodic behavior of the model.
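    The threshold-weakening rule can be sketched in a minimal 1D form (the paper's model is a long-range cellular automaton; the ring geometry, coupling fraction, and all values below are illustrative assumptions): once a block fails during an event, its failure threshold is reduced by the weakening parameter epsilon for all subsequent failures in that same event.

```python
import numpy as np

def run_event(stress, threshold, epsilon, coupling=0.4):
    """Propagate one avalanche on a 1D ring of slider blocks.

    A block whose stress reaches its threshold fails, drops to zero, and
    passes a fraction `coupling` of its stress to each neighbour (the rest
    is dissipated, which guarantees termination).  After its first failure
    in the event, a block's threshold is reduced to (1 - epsilon) * threshold
    -- a minimal 1D version of the paper's weakening mechanism.
    Returns the event size (total number of block failures).
    """
    n = len(stress)
    failed_once = np.zeros(n, dtype=bool)
    size = 0
    while True:
        eff = np.where(failed_once, (1.0 - epsilon) * threshold, threshold)
        over = np.flatnonzero(stress >= eff)
        if over.size == 0:
            return size
        for i in over:
            size += 1
            s = stress[i]
            stress[i] = 0.0
            stress[(i - 1) % n] += coupling * s   # load the neighbours
            stress[(i + 1) % n] += coupling * s
            failed_once[i] = True                 # weakened for this event

rng = np.random.default_rng(0)
stress = rng.uniform(0.0, 0.9, 64)
threshold = np.ones(64)
stress[0] = 1.0                                   # trigger the event
event_size = run_event(stress, threshold, epsilon=0.2)
```

Sweeping epsilon in a sketch like this is how one would probe the crossover to system-wide events that the paper quantifies.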

    When the optimal is not the best: parameter estimation in complex biological models

    Background: The vast computational resources that became available during the past decade enabled the development and simulation of increasingly complex mathematical models of cancer growth. These models typically involve many free parameters whose determination is a substantial obstacle to model development. Direct measurement of biochemical parameters in vivo is often difficult and sometimes impracticable, while fitting them under data-poor conditions may result in biologically implausible values. Results: We discuss different methodological approaches to estimate parameters in complex biological models. We make use of the high computational power of the Blue Gene technology to perform an extensive study of the parameter space in a model of avascular tumor growth. We explicitly show that the landscape of the cost function used to optimize the model to the data has a very rugged surface in parameter space. This cost function has many local minima with unrealistic solutions, including the global minimum corresponding to the best fit. Conclusions: The case studied in this paper shows one example in which model parameters that optimally fit the data are not necessarily the best ones from a biological point of view. To avoid force-fitting a model to a dataset, we propose that the best model parameters should be found by choosing, among suboptimal parameters, those that match criteria other than the ones used to fit the model. We also conclude that the model, data, and optimization approach form a new complex system, and point to the need for a theory that addresses this problem more generally.
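    The ruggedness argument can be illustrated with a toy cost surface (a Rastrigin-style stand-in, not the tumor model's actual cost function): local, gradient-based fitting started from different parameter guesses settles into different local minima, so the "optimal" fit depends entirely on initialization.

```python
import math

def cost(p):
    """Rastrigin-style 1D stand-in for a rugged model-vs-data cost surface:
    a smooth bowl (p^2) overlaid with oscillations that create a local
    minimum near every integer."""
    return p * p + 10.0 * (1.0 - math.cos(2.0 * math.pi * p))

def grad(p):
    return 2.0 * p + 20.0 * math.pi * math.sin(2.0 * math.pi * p)

def local_fit(p0, lr=1e-3, steps=20000):
    """Plain gradient descent: it can only reach the minimum in whose
    basin the starting guess p0 happens to lie."""
    p = p0
    for _ in range(steps):
        p -= lr * grad(p)
    return p

# Different starting guesses settle into different local minima; only the
# start near 0 finds the global minimum at p = 0, and every other fit
# returns a locally "optimal" but worse parameter value.
starts = [-2.2, -0.9, 0.1, 1.1, 2.3]
fits = [local_fit(p0) for p0 in starts]
```

Multi-start sweeps like this, scaled up to many parameters, are essentially what the paper's large-scale parameter-space study performs, and why it argues for selecting among suboptimal fits using independent biological criteria.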

    Nipah Virus Infection in Dogs, Malaysia, 1999

    The 1999 outbreak of Nipah virus encephalitis in humans and pigs in Peninsular Malaysia ended with the evacuation of humans and culling of pigs in the epidemic area. Serologic screening showed that, in the absence of infected pigs, dogs were not a secondary reservoir for Nipah virus.

    Curriculum-making in school and college: The case of hospitality

    Drawing upon research in the curriculum of Hospitality, this article explores the contrasting ways in which the prescribed curriculum is translated into the enacted curriculum in school and college contexts. It identifies organisational culture and teacher and student backgrounds and dispositions as central to the emerging contrasts. It uses this evidence to argue that the evolution of credit frameworks which assume a rational curriculum is unhelpful in understanding the multiple plays of difference in learning and the enacted curriculum.